Assessing Prediction Error of Nonparametric Regression and Classification under Bregman Divergence
Abstract
Prediction error is critical to assessing the performance of statistical methods and selecting statistical models. We propose cross-validation and approximated cross-validation methods for estimating prediction error under a broad q-class of Bregman divergence for error measures, which embeds nearly all of the loss functions commonly used in regression, classification procedures, and the machine learning literature. The approximated cross-validation formulas are analytically derived, which facilitates fast estimation of prediction error under the Bregman divergence. We then study a data-driven optimal bandwidth selector for local-likelihood estimation that minimizes the overall prediction error or, equivalently, the covariance penalty. It is shown that the covariance penalty and cross-validation methods converge to the same mean-prediction-error criterion. We also propose a lower-bound scheme for computing the local logistic regression estimates and demonstrate that it is as simple and stable as local least-squares regression estimation. The algorithm monotonically increases the target local likelihood and converges. The ideas and methods are extended to generalized varying-coefficient models and semiparametric models.
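As a hedged illustration of this setup (a minimal sketch, not the paper's estimator or its exact q-class parameterization): one common way to write a Bregman divergence generated by a concave function q is Q_q(y, mu) = q(mu) - q(y) + q'(mu)(y - mu), so that q(mu) = -mu^2 yields squared-error loss and the Bernoulli entropy yields, up to a factor of 2, the binomial deviance. The sketch below estimates prediction error under such a loss by K-fold cross-validation and selects the bandwidth of a simple Nadaraya-Watson smoother by minimizing that estimate; the smoother, the Gaussian kernel, and all function names are illustrative assumptions rather than the local-likelihood procedure developed in the paper.

```python
# Illustrative sketch only: cross-validated prediction error under a Bregman-type
# loss, used to pick a bandwidth for a simple local-constant smoother.
import numpy as np

def bregman_q_squared(y, mu):
    """Bregman divergence generated by q(mu) = -mu**2: reduces to squared error."""
    return (y - mu) ** 2

def bregman_deviance(y, mu, eps=1e-8):
    """Binomial deviance; up to the factor 2, the Bregman divergence generated
    by the Bernoulli entropy q, for y and mu in (0, 1)."""
    y = np.clip(y, eps, 1 - eps)
    mu = np.clip(mu, eps, 1 - eps)
    return 2.0 * (y * np.log(y / mu) + (1 - y) * np.log((1 - y) / (1 - mu)))

def nw_smoother(x_train, y_train, x_eval, h):
    """Nadaraya-Watson (local-constant) estimate with a Gaussian kernel and bandwidth h."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / h) ** 2)
    return (w @ y_train) / np.maximum(w.sum(axis=1), 1e-12)

def cv_prediction_error(x, y, h, loss=bregman_q_squared, n_folds=5, seed=0):
    """K-fold cross-validation estimate of prediction error under a Bregman loss."""
    rng = np.random.default_rng(seed)
    fold = rng.permutation(len(x)) % n_folds
    errors = []
    for k in range(n_folds):
        test = fold == k
        mu_hat = nw_smoother(x[~test], y[~test], x[test], h)
        errors.append(loss(y[test], mu_hat).mean())
    return float(np.mean(errors))

# Data-driven bandwidth: minimize the cross-validated prediction error over a grid.
rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(200)
bandwidth_grid = np.linspace(0.02, 0.30, 15)
h_star = min(bandwidth_grid, key=lambda h: cv_prediction_error(x, y, h))
print("selected bandwidth:", round(float(h_star), 3))
```

Swapping bregman_deviance (with responses in (0, 1)) into cv_prediction_error gives a classification-style error measure in the same q-class spirit.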
Similar Articles
Penalized Bregman Divergence Estimation via Coordinate Descent
Variable selection via penalized estimation is appealing for dimension reduction. For penalized linear regression, Efron et al. (2004) introduced the LARS algorithm. Recently, the coordinate descent (CD) algorithm was developed by Friedman et al. (2007) for penalized linear regression and penalized logistic regression and was shown to gain computational superiority. This paper explores...
Generalized Optimal Reverse Prediction
Recently it has been shown that classical supervised and unsupervised training methods can be unified as special cases of so-called “optimal reverse prediction”: predicting inputs from target labels while optimizing over both model parameters and missing labels. Although this perspective establishes links between classical training principles, the existing formulation only applies to linear pre...
Transductive De-Noising and Dimensionality Reduction using Total Bregman Regression
Our goal, on the one hand, is to use labels or other forms of ground-truth data to guide the tasks of de-noising and dimensionality reduction and to balance the objectives of better prediction and better data summarization; on the other hand, it is to explicitly model the noise in the feature values. We use a generalization of the L2 loss, on which PCA and K-Means are based, to the Bregman family which, as a...
A Market Framework for Eliciting Private Data
We propose a mechanism for purchasing information from a sequence of participants. The participants may simply hold data points they wish to sell, or may have more sophisticated information; either way, they are incentivized to participate as long as they believe their data points are representative or their information will improve the mechanism’s future prediction on a test set. The mechanism...
Comparison of artificial neural network with logistic regression in prediction of tendency to surgical intervention in nurses
Introduction: Logistic regression is one of the modeling methods for binary dependent variables. On the other hand, the artificial neural network is a flexible method with few limitations. The growing number of unnecessary cosmetic surgeries and the importance of prediction and classification led us to conduct the present study, with the aim of comparing logistic regression and artificial ...